Learning to Anticipate Egocentric Actions by Imagination

Authors

Abstract

Anticipating actions before they are executed is crucial for a wide range of practical applications, including autonomous driving and robotics. In this paper, we study the egocentric action anticipation task, which predicts the future action seconds before it is performed in egocentric videos. Previous approaches focus on summarizing the observed content and directly predicting the future action based on past observations. We believe action anticipation would benefit if we could mine cues that compensate for the missing information of the unobserved frames. We therefore propose to decompose action anticipation into a series of future feature predictions: we imagine how the visual features change in the near future and then predict future action labels based on these imagined representations. Differently from prior work, our ImagineRNN is optimized in a contrastive learning way instead of by feature regression; we utilize a proxy task to train the ImagineRNN, i.e., selecting the correct future states from distractors. We further improve ImagineRNN by residual anticipation, i.e., changing its prediction target to the feature difference of adjacent frames instead of the frame content. This promotes the network to focus on our target, the future action, as the difference between adjacent frame features is more important for forecasting the future. Extensive experiments on two large-scale egocentric action datasets validate the effectiveness of our method. Our method significantly outperforms previous methods on both the seen test set and the unseen test set of the EPIC Kitchens Action Anticipation Challenge.
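To make the abstract's two key ideas concrete (a contrastively trained future-feature predictor and residual anticipation), the following is a minimal sketch, assuming a GRU-based imaginer, 1024-dimensional frame features, and an InfoNCE-style loss with in-batch distractors. These choices, and all names below, are illustrative assumptions; they are not taken from the paper or its released code.

# Minimal sketch of an ImagineRNN-style future-feature predictor (assumptions, not the authors' code).
import torch
import torch.nn as nn
import torch.nn.functional as F


class ImagineRNNSketch(nn.Module):
    """Imagines future frame features step by step with a GRU.

    Residual anticipation: the head predicts the *difference* between
    adjacent frame features; the next feature is the current one plus
    that predicted difference.
    """

    def __init__(self, feat_dim=1024, hidden_dim=512):
        super().__init__()
        self.rnn = nn.GRUCell(feat_dim, hidden_dim)
        self.delta_head = nn.Linear(hidden_dim, feat_dim)

    def forward(self, observed_feats, num_future_steps):
        # observed_feats: (batch, time, feat_dim) features of the observed frames.
        batch = observed_feats.size(0)
        h = observed_feats.new_zeros(batch, self.rnn.hidden_size)
        # Encode the observed segment.
        for t in range(observed_feats.size(1)):
            h = self.rnn(observed_feats[:, t], h)
        # Roll out imagined future features.
        feat = observed_feats[:, -1]
        imagined = []
        for _ in range(num_future_steps):
            h = self.rnn(feat, h)
            feat = feat + self.delta_head(h)  # residual anticipation
            imagined.append(feat)
        return torch.stack(imagined, dim=1)  # (batch, steps, feat_dim)


def contrastive_future_loss(imagined, true_future, temperature=0.1):
    """Proxy task: select the correct future state among distractors.
    Here the other (clip, step) pairs in the batch serve as negatives."""
    b, t, d = imagined.shape
    pred = F.normalize(imagined.reshape(b * t, d), dim=-1)
    target = F.normalize(true_future.reshape(b * t, d), dim=-1)
    logits = pred @ target.t() / temperature              # pairwise similarities
    labels = torch.arange(b * t, device=logits.device)    # the matching index is the positive
    return F.cross_entropy(logits, labels)


if __name__ == "__main__":
    model = ImagineRNNSketch()
    obs = torch.randn(4, 8, 1024)     # 4 clips, 8 observed frame features each
    future = torch.randn(4, 3, 1024)  # ground-truth features of 3 future frames
    loss = contrastive_future_loss(model(obs, num_future_steps=3), future)
    loss.backward()
    print(loss.item())

The imagined features would then be fed to an action classifier to produce the anticipated labels; that stage is omitted here.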

Related articles

Unsupervised Learning of Deep Feature Representation for Clustering Egocentric Actions

The popularity of wearable cameras in life logging, law enforcement, assistive vision and other similar applications is leading to an explosion in the generation of egocentric video content. First person action recognition is an important aspect of automatic analysis of such videos. Annotating such videos is hard, not only because of obvious scalability constraints, but also because of privacy issues ofte...

Encouraging LSTMs to Anticipate Actions Very Early Supplementary Material

In this supplementary material, we analyze different aspects of our approach via several additional experiments. While the main paper discusses action anticipation, here, we focus on evaluating our approach on the task of action recognition. Therefore, we first provide a comparison to the state-of-the-art action recognition methods on three standard benchmarks, and evaluate the effect of exploi...

Humans Anticipate the Goal of other People’s Point-Light Actions

This eye tracking study investigated the degree to which biological motion information from manual point-light displays provides sufficient information to elicit anticipatory eye movements. We compared gaze performance of adults observing a biological motion point-light display of a hand reaching for a goal object or a non-biological version of the same event. Participants anticipated the goal ...

Learning to Anticipate the Movements of Intermittently Occluded Objects

A model of event driven anticipatory learning is described and applied to a number of attention situations where one or several visual targets need to be tracked while being intermittently occluded. The model combines covert tracking of multiple targets with overt control of a single attention focus. The implemented system has been applied to both a simple scenario with a car that is occluded i...

Learning to anticipate a temporarily hidden moving object

In this paper we provide a robot with the ability to anticipate the location of reappearance of a moving target, usually a ball, which is temporarily hidden behind a wall. The images taken by the robot are simplified and transformed into a smaller number of sector views, whereby each sector is assigned to one of the states "target", "wall" or "background". Based on the current observed sector v...

Journal

Journal title: IEEE Transactions on Image Processing

Year: 2021

ISSN: 1057-7149, 1941-0042

DOI: https://doi.org/10.1109/tip.2020.3040521